A practical comparison between Quickprop and back-propagation

Authors

  • Sam Waugh
  • Anthony Adams
Abstract

A number of studies have questioned the validity of the Quickprop algorithm and of the activation function derivative offset as methods of speeding up backpropagation-style training. This paper follows on from those results with a practical study using generated data sets. The generation method is briefly presented, along with the results of comparisons which indicate, at least on these data sets, that Quickprop does not perform as well as standard back-propagation, and that the activation function derivative offset is detrimental to learning.
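For context, the two techniques under comparison can be sketched briefly. Quickprop replaces the plain gradient-descent step with a secant-based parabolic jump computed from the current and previous error slopes, and the derivative offset adds a small constant to the sigmoid derivative so that weights keep moving in the flat tails of the activation function. A minimal single-weight sketch, assuming Fahlman's usual defaults (the function names and the simplified fallback handling here are illustrative, not taken from the paper):

```python
def quickprop_step(grad, prev_grad, prev_step, lr=0.1, mu=1.75):
    """One Quickprop update for a single weight.

    Quickprop models the error surface along each weight as a parabola
    and jumps toward its estimated minimum using the current and
    previous slopes.  With no previous step it falls back to ordinary
    gradient descent; the jump is capped at mu times the previous step
    (the "maximum growth factor").
    """
    if prev_step == 0.0:
        return -lr * grad              # bootstrap: plain gradient descent
    denom = prev_grad - grad
    if denom == 0.0:                   # slope unchanged: take the capped jump
        ratio = mu
    else:
        ratio = grad / denom           # secant estimate of the parabola minimum
    ratio = max(-mu, min(mu, ratio))   # enforce the growth limit
    return ratio * prev_step


def sigmoid_prime_offset(y, offset=0.1):
    """Sigmoid derivative with Fahlman's constant offset, given the
    unit's output y = sigmoid(x).  The offset keeps the derivative
    nonzero when y saturates near 0 or 1."""
    return y * (1.0 - y) + offset
```

With no previous step, `quickprop_step(1.0, 2.0, 0.0)` reduces to a gradient-descent step of `-0.1`; with a previous step of `0.5` and slopes `3.0` then `1.0`, the parabolic jump gives `0.25`.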


Similar articles

The Cascade-Correlation Learning Architecture

Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side w...


Compensation of intra-channel nonlinear fibre impairments using simplified digital back-propagation algorithm.

We investigate a digital back-propagation simplification method to enable computationally-efficient digital nonlinearity compensation for a coherently-detected 112 Gb/s polarization multiplexed quadrature phase shifted keying transmission over a 1,600 km link (20 x 80 km) with no inline compensation. Through numerical simulation, we report up to 80% reduction in required back-propagation steps ...


Application of Linear Regression and Artificial Neural Network for Broiler Chicken Growth Performance Prediction

This study was conducted to investigate the prediction of growth performance using linear regression and artificial neural network (ANN) in broiler chicken. Artificial neural networks (ANNs) are powerful tools for modeling systems in a wide range of applications. The ANN model with a back propagation algorithm successfully learned the relationship between the inputs of metabolizable energy (kca...


Neural Network Architectures and Learning

Various learning methods for neural networks, including supervised and unsupervised methods, are presented and illustrated with examples. A general learning rule as a function of the incoming signals is discussed. Other learning rules such as Hebbian learning, perceptron learning, LMS (Least Mean Square) learning, delta learning, WTA (Winner Take All) learning, and PCA (Principal Component Analy...
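Of the rules listed above, the delta (LMS, or Widrow-Hoff) rule is the simplest to state: the weight vector moves in proportion to the error between the target and the linear unit's output. A minimal sketch for a single linear unit (the function name and learning rate are illustrative assumptions, not from the survey):

```python
def delta_rule(w, x, target, lr=0.1):
    """One delta-rule (LMS) update for a linear unit.

    Computes the output y = w . x, then nudges each weight by
    lr * (target - y) * x_i, reducing the squared output error.
    """
    y = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + lr * (target - y) * xi for wi, xi in zip(w, x)]
```

Starting from zero weights with input `[1.0, 1.0]` and target `1.0`, one update moves the weights to `[0.1, 0.1]`; repeated presentations converge toward an exact fit.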


Comparison of Inductive Learning of Classification Tasks by Neural Networks*

A number of different data sets are used to compare a variety of neural network training algorithms: backpropagation, Quickprop, committees of backpropagation-style networks, and Cascade-Correlation. The results are further compared with a decision tree technique, C4.5, to assess which types of problems are more suited to the different classes of inductive learning algorithms.



Journal:

Volume   Issue 

Pages  -

Publication date: 1997